The paper considers a scaled variant of the stochastic gradient descent algorithm for the matrix completion problem. Specifically, we propose a novel matrix-scaling of the partial derivatives that acts as an efficient preconditioning for the standard stochastic gradient descent algorithm. The proposed matrix-scaling provides a trade-off between local and global second-order information. It also resolves the scale-invariance issue inherent in matrix factorization models. The overall computational complexity is linear in the number of known entries, so the approach extends to the large-scale setup. Numerical comparisons show that the proposed algorithm competes favorably with state-of-the-art algorithms on various benchmarks.
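To make the setup concrete, the sketch below illustrates the general idea on a toy low-rank model. The abstract does not specify the exact form of the matrix-scaling, so this is only an assumed instance: it factors X ≈ L R^T and preconditions the per-entry stochastic gradients with the regularized inverse Gram matrices R^T R and L^T L, one common way such a scaling removes the scale invariance (L, R) -> (L M, R M^{-T}) of factorization models. All names (L, R, P_L, P_R, lr, lam) are illustrative and not the paper's notation.

```python
import numpy as np

# Illustrative sketch only: the paper's exact matrix-scaling is not given in the
# abstract. We assume X ≈ L @ R.T and precondition the per-entry SGD gradients
# with the (regularized) inverse Gram matrices R.T @ R and L.T @ L.

rng = np.random.default_rng(0)

m, n, k = 100, 80, 5                                   # toy sizes and rank
X_true = rng.standard_normal((m, k)) @ rng.standard_normal((k, n))
obs = [(i, j) for i in range(m) for j in range(n) if rng.random() < 0.2]

L = rng.standard_normal((m, k))                        # left factor
R = rng.standard_normal((n, k))                        # right factor
lr, lam = 0.05, 1e-6                                   # step size, regularization

for epoch in range(30):
    # Refresh the (assumed) preconditioners once per epoch so the per-epoch cost
    # stays linear in the number of observed entries, plus an O((m + n) k^2) term.
    P_L = np.linalg.inv(R.T @ R + lam * np.eye(k))
    P_R = np.linalg.inv(L.T @ L + lam * np.eye(k))
    for idx in rng.permutation(len(obs)):
        i, j = obs[idx]
        r = L[i] @ R[j] - X_true[i, j]                 # residual at the sampled entry
        gi, gj = r * R[j], r * L[i]                    # plain partial derivatives
        L[i] = L[i] - lr * gi @ P_L                    # scaled (preconditioned) updates
        R[j] = R[j] - lr * gj @ P_R
```

Refreshing the preconditioners once per epoch rather than per entry is a design choice made here to keep the per-entry update cost at O(k^2), consistent with the linear-complexity claim; the paper's actual update schedule may differ.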